Interval Scheduling Algorithm

The interval scheduling algorithm is a classic optimization technique that maximizes the number of non-overlapping tasks or events that can be scheduled within a given time period. It is useful in real-world applications such as resource allocation, meeting room booking, and job scheduling.

The problem is formally defined as follows: given a set of intervals, each with a start time and a finish time, find the largest set of mutually compatible intervals, i.e. a set in which no two intervals overlap in time. A crucial constraint is that intervals cannot be divided or split into smaller parts.

The standard solution is a greedy algorithm that selects intervals by finish time. First, sort the intervals by finish time in non-decreasing order. Then iterate through them, selecting each interval that does not conflict with the most recently selected one. This greedy strategy is guaranteed to produce an optimal solution: by always taking the compatible interval with the earliest finish time, it leaves the maximum remaining time for the intervals that follow. The overall time complexity is O(n log n), dominated by the sort, making the algorithm efficient even for large inputs.
/*
 Petar 'PetarV' Velickovic
 Algorithm: Interval Scheduling
*/

#include <stdio.h>
#include <math.h>
#include <string.h>
#include <iostream>
#include <vector>
#include <list>
#include <string>
#include <algorithm>
#include <queue>
#include <stack>
#include <set>
#include <map>
#include <complex>
#define MAX_N 100001
using namespace std;
typedef long long lld;

int n;

struct Interval
{
    int L, R;
    bool operator <(const Interval &a) const
    {
        if (R != a.R) return (R < a.R);
        return (L < a.L);
    }
};
Interval I[MAX_N];

//Algorithm that determines the maximum number of disjoint intervals in a given set
//Complexity: O(n log n)

inline int ScheduleIntervals()
{
    if (n == 0) return 0; // guard: avoid reading I[0] on empty input
    sort(I, I+n);
    
    int ret = 1;
    int currentEnd = I[0].R;
    
    for (int i=1;i<n;i++)
    {
        if (I[i].L >= currentEnd)
        {
            currentEnd = I[i].R;
            ret++;
        }
    }
    
    return ret;
}

int main()
{
    n = 4;
    
    I[0].L = -1, I[0].R = 1;
    I[1].L = 0, I[1].R = 5;
    I[2].L = 2, I[2].R = 3;
    I[3].L = 5, I[3].R = 9;
    
    printf("%d\n",ScheduleIntervals());
    return 0;
}
